AI will eventually need an international authority, OpenAI leaders say
Sam Altman, the CEO of artificial intelligence lab OpenAI, told a Senate panel he welcomes federal regulation of the technology "to mitigate" its risks.

The artificial intelligence field will need an international watchdog to regulate future superintelligence, according to OpenAI's leadership. In a blog post, Altman and company leaders Greg Brockman and Ilya Sutskever wrote that, given the potential existential risk, the world "can't just be reactive," comparing the technology to nuclear energy.

To that end, they suggested coordination among leading development efforts, noting there are "many ways this could be implemented," including a project set up by major governments or agreed limits on annual growth in capability.

"Second, we are likely to eventually need something like an IAEA for superintelligence efforts; any effort above a certain capability (or resources like compute) threshold will need to be subject to an international authority that can inspect systems, require audits, test for compliance with safety standards, place restrictions on degrees of deployment and levels of security, etc.," they asserted.